In linear algebra, the outer product typically refers to the tensor product of two vectors. The result of applying the outer product to a pair of vectors is a matrix. The name contrasts with the inner product, which takes as input a pair of vectors and produces a scalar.
The outer product of vectors can also be regarded as a special case of the Kronecker product of matrices.
Some authors use the expression "outer product of tensors" as a synonym of "tensor product". The outer product is also a higher-order function in some computer programming languages such as APL and Mathematica.
Given a vector $\mathbf{u} = (u_1, u_2, \dots, u_m)$ with $m$ elements and a vector $\mathbf{v} = (v_1, v_2, \dots, v_n)$ with $n$ elements, their outer product $\mathbf{u} \otimes \mathbf{v}$ is defined as the $m \times n$ matrix $\mathbf{A}$ obtained by multiplying each element of $\mathbf{u}$ by each element of $\mathbf{v}$:

$$\mathbf{u} \otimes \mathbf{v} = \mathbf{A} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & \cdots & u_1 v_n \\ u_2 v_1 & u_2 v_2 & \cdots & u_2 v_n \\ \vdots & \vdots & \ddots & \vdots \\ u_m v_1 & u_m v_2 & \cdots & u_m v_n \end{bmatrix}$$

Note that in index notation, $(\mathbf{u} \otimes \mathbf{v})_{ij} = u_i v_j$.
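As a concrete check of this definition, here is a minimal sketch assuming NumPy (the vectors u and v are illustrative); it computes the outer product and verifies the entry-wise rule $(\mathbf{u} \otimes \mathbf{v})_{ij} = u_i v_j$:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])        # m = 3 elements
v = np.array([4.0, 5.0])             # n = 2 elements

A = np.outer(u, v)                   # m x n matrix of products u_i * v_j
print(A)
# [[ 4.  5.]
#  [ 8. 10.]
#  [12. 15.]]

# Entry-wise check: A[i, j] == u[i] * v[j] for all i, j
assert all(A[i, j] == u[i] * v[j] for i in range(3) for j in range(2))
```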
For complex vectors, it is customary to use the complex conjugate of $\mathbf{v}$ (denoted $\bar{\mathbf{v}}$). Namely, the matrix $\mathbf{A}$ is obtained by multiplying each element of $\mathbf{u}$ by the complex conjugate of each element of $\mathbf{v}$, so that $A_{ij} = u_i \bar{v}_j$.
The outer product as defined above is equivalent to the matrix multiplication $\mathbf{u} \mathbf{v}^{\mathsf{T}}$, provided that $\mathbf{u}$ is represented as an $m \times 1$ column vector and $\mathbf{v}$ as an $n \times 1$ column vector (which makes $\mathbf{v}^{\mathsf{T}}$ a row vector). For instance, if $m = 4$ and $n = 3$, then

$$\mathbf{u} \otimes \mathbf{v} = \mathbf{u} \mathbf{v}^{\mathsf{T}} = \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ u_4 \end{bmatrix} \begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & u_1 v_3 \\ u_2 v_1 & u_2 v_2 & u_2 v_3 \\ u_3 v_1 & u_3 v_2 & u_3 v_3 \\ u_4 v_1 & u_4 v_2 & u_4 v_3 \end{bmatrix}$$
For complex vectors, it is customary to use the conjugate transpose of $\mathbf{v}$ (denoted $\mathbf{v}^\dagger$):

$$\mathbf{u} \otimes \mathbf{v} = \mathbf{u} \mathbf{v}^\dagger$$
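A minimal sketch of this matrix-multiplication view, again assuming NumPy (all vector names are illustrative): reshaping u into a column and v into a row reproduces np.outer, and for complex vectors the conjugate transpose must be taken explicitly.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0, 4.0])   # column vector u (m = 4)
v = np.array([5.0, 6.0, 7.0])        # column vector v (n = 3)

# u v^T: a (4, 1) column times a (1, 3) row gives a 4 x 3 matrix
A = u.reshape(-1, 1) @ v.reshape(1, -1)
assert np.array_equal(A, np.outer(u, v))

# Complex case: use the conjugate transpose v^dagger
w = np.array([1 + 2j, 3 - 1j])
z = np.array([2j, 1 - 1j])
B = w.reshape(-1, 1) @ z.conj().reshape(1, -1)   # w z^dagger
# note: np.outer(w, z) alone would NOT conjugate z
assert np.allclose(B, np.outer(w, z.conj()))
```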
If $m = n$, then one can take the matrix product the other way, yielding a scalar (or $1 \times 1$ matrix):

$$\left\langle \mathbf{u}, \mathbf{v} \right\rangle = \mathbf{v}^{\mathsf{T}} \mathbf{u}$$
which is the standard inner product for Euclidean vector spaces, better known as the dot product. The inner product is the trace of the outer product.
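The trace identity can be checked numerically; a short sketch assuming NumPy, with illustrative vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

outer = np.outer(u, v)               # 3 x 3 matrix u v^T
inner = np.dot(v, u)                 # scalar v^T u (the dot product)

# The inner product is the trace of the outer product
assert np.isclose(np.trace(outer), inner)   # both equal 32.0
```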
Let V and W be two vector spaces, and let W* be the dual space of W. Given a vector x ∈ V and a covector y* ∈ W*, the tensor product y* ⊗ x corresponds to the map A : W → V given by

w ↦ y*(w) x.
Here y*(w) denotes the value of the linear functional y* (which is an element of the dual space of W) when evaluated at the element w ∈ W. This scalar in turn is multiplied by x to give as the final result an element of the space V.
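The following sketch (NumPy, with the illustrative names x, y_star, and w) represents the covector y* as a row vector, so the map A : W → V becomes the matrix with entries x_i y*_j, and applying it to w ∈ W scales x by the scalar y*(w):

```python
import numpy as np

x = np.array([1.0, 2.0])             # x in V  (dim V = 2)
y_star = np.array([3.0, 0.0, -1.0])  # covector y* in W* (dim W = 3), as a row

# The map A : W -> V given by w |-> y*(w) x, as a 2 x 3 matrix
A = np.outer(x, y_star)

w = np.array([2.0, 5.0, 4.0])        # some w in W
scalar = y_star @ w                  # y*(w) = 3*2 + 0*5 - 1*4 = 2
assert np.allclose(A @ w, scalar * x)
```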
Thus, intrinsically, the outer product is defined for a vector and a covector; to define the outer product of two vectors, one must first convert one of them to a covector (in coordinates, take the transpose), which requires a bilinear form, generally taken to be a nondegenerate form (so that the conversion is an isomorphism) or, more narrowly, an inner product.
If V and W are finite-dimensional, then the space of all linear transformations from W to V, denoted Hom(W,V), is generated by such outer products; in fact, the rank of a matrix is the minimal number of such outer products needed to express it as a sum (this is the tensor rank of a matrix). In this case Hom(W,V) is isomorphic to W* ⊗ V.
If W = V, then one can also pair the covector w* ∈ V* with the vector v ∈ V via (w*, v) ↦ w*(v), which is the duality pairing between V and its dual, sometimes called the inner product.
The outer product on tensors is typically referred to as the tensor product. Given a tensor a with rank q and dimensions (i_1, ..., i_q), and a tensor b with rank r and dimensions (j_1, ..., j_r), their outer product c has rank q + r and dimensions (k_1, ..., k_{q+r}), which are the i dimensions followed by the j dimensions. For example, if A has rank 3 and dimensions (3, 5, 7) and B has rank 2 and dimensions (10, 100), their outer product C has rank 5 and dimensions (3, 5, 7, 10, 100). If A[2, 2, 4] = 11 and B[8, 88] = 13, then C[2, 2, 4, 8, 88] = 11 × 13 = 143.
To understand the matrix definition of the outer product in terms of the definition of the tensor product: a vector with m elements is a rank-1 tensor with dimension (m), so the outer product of an m-element vector and an n-element vector is a rank-2 tensor with dimensions (m, n), that is, an m × n matrix.
The term "rank" is used here in its tensor sense, and should not be interpreted as matrix rank.
The outer product is useful in computing physical quantities (e.g., the tensor of inertia), and performing transform operations in digital signal processing and digital image processing. It is also useful in statistical analysis for computing the covariance and auto-covariance matrices for two random variables.
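As an illustration of the statistical use, the following sketch (NumPy, with made-up data) estimates a covariance matrix as the average of outer products of centered samples and compares it against np.cov:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))       # 1000 samples of a 3-dimensional random vector

mean = X.mean(axis=0)
centered = X - mean

# Covariance as the (unbiased) average outer product of centered samples
cov = sum(np.outer(s, s) for s in centered) / (len(X) - 1)

assert np.allclose(cov, np.cov(X, rowvar=False))
```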